---
title: Robots.txt Mistakes That Block AI Assistants
slug: robots-txt-mistakes
---

### Common Errors

Misconfigured robots.txt rules often block AI assistants from reading the structured-data directories (for example, JSON or YAML feeds) they rely on.

### Top Issues to Avoid

- Using `Disallow: /`, which blocks crawlers from the entire site, not just the root folder.
- Forgetting to allow the directories that hold JSON or YAML files.
- Leaving test or placeholder domains listed in `Sitemap:` references.

A sketch of a robots.txt that combines all three mistakes follows this list.

### How to Fix Them

Edit robots.txt to explicitly permit AI crawlers and reference updated sitemaps at the bottom of the file, as in the corrected sketch below.
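
As a minimal sketch of what to avoid (the `/data/` path and the staging domain are hypothetical), a robots.txt exhibiting the issues above might look like this:

```txt
# Broken example: each directive below illustrates one of the mistakes.
User-agent: *
Disallow: /          # Blocks every crawler from the whole site, not just root
Disallow: /data/     # Hypothetical JSON/YAML directory kept off-limits

Sitemap: https://staging.example.test/sitemap.xml   # Stale placeholder domain
```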
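
A corrected sketch, assuming the site wants to admit common AI crawlers. The user-agent tokens shown (GPTBot for OpenAI, ClaudeBot for Anthropic) are documented crawler names, but verify them against each vendor's current documentation; the `/admin/` and `/data/` paths and the domain are hypothetical:

```txt
# Name AI crawlers explicitly and allow them full access.
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Everyone else: keep private areas closed, but leave data feeds open.
User-agent: *
Disallow: /admin/
Allow: /data/

# Sitemap references go at the bottom and must point at the live domain.
Sitemap: https://www.example.com/sitemap.xml
```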
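
To confirm the edited file actually admits a given crawler, one quick check is Python's standard-library `urllib.robotparser` (the domain, path, and user-agent string here are assumptions for illustration):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (hypothetical domain).
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

# True if the named crawler may fetch the structured-data URL.
print(rp.can_fetch("GPTBot", "https://www.example.com/data/feed.json"))
```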